
wget command

wget - The non-interactive network downloader.

The wget command in Linux is a powerful tool for downloading files from the web over HTTP, HTTPS, and FTP. It is non-interactive, which makes it well suited for scripts and for downloading files directly from the terminal.

Usage: wget [OPTION]... [URL]...

  • URL: The web address of the file to download.
  • OPTION: Flags to modify behavior (e.g., output file, retries).

Common Options

Option         Description
-O             Specify output filename
-P             Specify output directory
-c             Resume a partial download
-i             Download from a list of URLs
--limit-rate   Limit download speed (e.g., 200k)
-r             Recursive download
-l             Limit recursive depth
-b             Run in background
-q             Quiet mode (no output)
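
These options can be combined in a single command. A minimal sketch, assuming a hypothetical URL and download directory:

    wget -c -P /home/user/downloads --limit-rate=200k https://example.com/largefile.zip

This resumes (or starts) largefile.zip, saves it under /home/user/downloads, and caps the transfer rate at 200 KB/s.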

Examples

  • Basic Download

    Download a file from a URL to the current directory.

    wget https://example.com/file.txt
    • Downloads file.txt to your current directory.
    • Shows a progress bar with speed and size.
  • Specifying Output File

    Use -O (uppercase) to save the file with a specific name.

    wget -O myfile.txt https://example.com/file.txt
    • Saves as myfile.txt instead of the default name.
  • Downloading to a Directory

    Use -P to specify a directory for the downloaded file.

    wget -P /home/user/downloads https://example.com/file.txt
    • Saves file.txt to /home/user/downloads.
  • Resuming Downloads

    Use -c to resume a partially downloaded file.

    wget -c https://example.com/largefile.zip
    • Continues downloading largefile.zip if interrupted (e.g., by Ctrl+C).
  • Downloading Multiple Files

    Pass multiple URLs or use a file with URLs via -i.

    Example (Multiple URLs):

    wget https://example.com/file1.txt https://example.com/file2.txt

    Example (From File):

    Create urls.txt:

    https://example.com/file1.txt
    https://example.com/file2.txt

    Then:

    wget -i urls.txt
    • Downloads all listed files.
  • Limiting Download Speed

    Use --limit-rate to cap bandwidth usage.

    wget --limit-rate=200k https://example.com/largefile.zip
    • Limits speed to 200 KB/s (k = KB, m = MB).
  • Recursive Download

    Use -r to download a website or directory recursively.

    wget -r https://example.com/docs/
    • Downloads docs/ and its contents (e.g., HTML, images).
    • Use -l to limit depth:
      wget -r -l 2 https://example.com/docs/
      • Goes 2 levels deep.
  • Background Download

    Use -b to run wget in the background.

    wget -b https://example.com/largefile.zip
    • Output goes to wget-log; check progress with tail -f wget-log.
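  • Quiet Mode

    Use -q to silence wget's output (a minimal sketch; the URL is a placeholder).

    wget -q https://example.com/file.txt
    • Downloads file.txt without printing any progress or status messages.
    • Useful in scripts or cron jobs where only the exit status matters.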

To get help with the wget command, use the --help option:
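
    wget --help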

For more details, check the manual with man wget:
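
    man wget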